Rank structured matrices: theory, algorithms and applications

Abstract

In numerical linear algebra, much attention has been paid to matrices that are sparse, i.e., that contain many zeros. For example, to compute the eigenvalues of a general dense symmetric matrix, the matrix is first reduced to a similar tridiagonal one by an orthogonal similarity transformation. The subsequent QR-algorithm performed on this n×n tridiagonal matrix exploits its sparse structure, resulting in O(n) flops per iteration compared to O(n³) flops for a general dense matrix.

Much less attention has been given to so-called rank structured matrices, although they share similar theoretical and computational properties. As an example, consider the inverse of a tridiagonal matrix with nonzero subdiagonal elements. This inverse is generically a dense matrix; however, every submatrix taken from its lower (upper) triangular part has rank one. Such a matrix is called a semiseparable matrix.

The aim of this series of lectures is to gain insight into this class of matrices and some of its generalizations, collectively referred to as rank structured matrices. Rank structured matrices can be loosely defined as matrices that have one or more submatrices of low rank. Just as the sparsity pattern of a sparse matrix can be structured or unstructured, rank structured matrices can have a structured or unstructured pattern of low rank submatrices. These lectures focus on rank structured matrices with a specific pattern in the low rank submatrices. As time permits, one or more of the following topics will be covered:

• In the literature, different names and slightly different definitions are used for rank structured matrices. We will study these different definitions. Besides the different definitions, several different representations can also be used; we will discuss the advantages and disadvantages of these representations.
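As a quick illustration of the semiseparable structure mentioned above, the following Python/NumPy sketch (not part of the lecture notes; the matrix size and tolerance are arbitrary) builds a random tridiagonal matrix with nonzero sub- and superdiagonal entries, inverts it, and verifies that every submatrix lying entirely in the lower triangular part of the inverse has numerical rank at most one.

import numpy as np

rng = np.random.default_rng(0)
n = 8

# Random tridiagonal matrix with nonzero sub- and superdiagonal entries.
T = (np.diag(rng.uniform(1.0, 2.0, n))
     + np.diag(rng.uniform(0.5, 1.5, n - 1), -1)
     + np.diag(rng.uniform(0.5, 1.5, n - 1), 1))

S = np.linalg.inv(T)          # generically a dense matrix

# Every submatrix S[i:, :j+1] with j <= i lies entirely in the lower
# triangular part (diagonal included); numerically its second singular
# value is negligible, i.e. the block has rank one.
for i in range(n):
    for j in range(i + 1):
        sv = np.linalg.svd(S[i:, :j + 1], compute_uv=False)
        assert sv.size < 2 or sv[1] <= 1e-10 * sv[0]

print("all lower-triangular submatrices of inv(T) have numerical rank <= 1")

By symmetry of the argument, the same check succeeds on the upper triangular part when the superdiagonal entries are nonzero.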


Related articles

Accurate Solution of Structured Least Squares Problems via Rank-Revealing Decompositions

Least squares problems min_x ‖b − Ax‖₂ where the matrix A ∈ C^{m×n} (m ≥ n) has some particular structure arise frequently in applications. Polynomial data fitting is a well-known instance of problems that yield highly structured matrices, but many other examples exist. Very often, structured matrices have huge condition numbers κ₂(A) = ‖A‖₂ ‖A†‖₂ (A† is the Moore-Penrose pseudo-inverse of A) and, the...
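The following small sketch (illustrative only, not the algorithm of the paper) shows the setting: polynomial data fitting yields a Vandermonde least squares matrix whose condition number κ₂(A) is huge, and min_x ‖b − Ax‖₂ is solved here through the SVD, the simplest example of a rank-revealing decomposition.

import numpy as np

rng = np.random.default_rng(1)
m, deg = 50, 15
t = np.linspace(0.0, 1.0, m)

A = np.vander(t, deg + 1)                  # structured (Vandermonde) matrix
b = np.cos(4 * t) + 1e-3 * rng.standard_normal(m)

# The SVD A = U diag(s) V^T is the simplest rank-revealing decomposition.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print("kappa_2(A) =", s[0] / s[-1])        # huge condition number

# Least squares solution x = V diag(1/s) U^T b of min_x ||b - A x||_2.
x = Vt.T @ ((U.T @ b) / s)
print("residual norm =", np.linalg.norm(b - A @ x))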


Accurate Symmetric Rank Revealing and Eigendecompositions of Symmetric Structured Matrices

We present new O(n³) algorithms that compute eigenvalues and eigenvectors to high relative accuracy in floating point arithmetic for the following types of matrices: symmetric Cauchy, symmetric diagonally scaled Cauchy, symmetric Vandermonde, and symmetric totally nonnegative matrices when they are given as products of nonnegative bidiagonal factors. The algorithms are divided into two stages: ...
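A minimal sketch of the input class, assuming NumPy (the conventional eigh solver below only displays the extreme eigenvalue spread; it is not the paper's high-relative-accuracy algorithm): a symmetric Cauchy matrix is defined entirely by parameters x_i through C[i, j] = 1/(x_i + x_j), and its smallest eigenvalues lie many orders of magnitude below the largest one, which is exactly where relative accuracy is hard to achieve.

import numpy as np

n = 10
x = 1.0 + np.arange(n)                  # parameters defining the matrix
C = 1.0 / (x[:, None] + x[None, :])     # symmetric Cauchy (Hilbert-like)

w = np.linalg.eigh(C)[0]                # eigenvalues in ascending order
print("largest eigenvalue :", w[-1])
print("smallest eigenvalue:", w[0])     # many orders of magnitude smaller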


A Givens-Weight Representation for Rank Structured Matrices

In this paper we introduce a Givens-weight representation for rank structured matrices, where the rank structure is defined by certain low rank submatrices starting from the bottom left matrix corner. This representation will be compared to the (block) quasiseparable representations occurring in the literature. We will then provide some basic algorithms for the Givens-weight representation, in ...
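As a loose illustration of the idea behind a Givens-based representation (the paper's actual Givens-weight scheme is more elaborate), the sketch below compresses a rank-one block with a short sequence of Givens rotations, so that only the rotations and a single nonzero "weight" row remain to be stored.

import numpy as np

def givens(a, b):
    """Return c, s with [[c, s], [-s, c]] @ [a, b]^T = [r, 0]^T."""
    r = np.hypot(a, b)
    return (1.0, 0.0) if r == 0 else (a / r, b / r)

rng = np.random.default_rng(2)
B = np.outer(rng.standard_normal(5), rng.standard_normal(3))  # rank-one block

# Eliminate the first column from the bottom up; because the block has
# rank one, all remaining columns are annihilated as well, so only a
# single nonzero row (the "weight") survives.
for i in range(B.shape[0] - 1, 0, -1):
    c, s = givens(B[i - 1, 0], B[i, 0])
    G = np.array([[c, s], [-s, c]])
    B[i - 1:i + 1, :] = G @ B[i - 1:i + 1, :]

print(np.round(B, 12))   # only the first row is (numerically) nonzero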


Eigenvalue perturbation theory of classes of structured matrices under generic structured rank one perturbations

We study the perturbation theory of structured matrices under structured rank one perturbations, and then focus on several classes of complex matrices. Generic Jordan structures of perturbed matrices are identified. It is shown that the perturbation behavior of the Jordan structures in the case of singular J-Hamiltonian matrices is substantially different from the corresponding theory for unstr...
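A small numerical experiment for the generic, unstructured case (the structured classes studied in the paper, such as J-Hamiltonian matrices, can behave differently, which is precisely the paper's point): a rank one perturbation of size eps applied to a single n×n Jordan block at the eigenvalue 0 scatters that eigenvalue over a circle of radius roughly eps^(1/n).

import numpy as np

rng = np.random.default_rng(3)
n, eps = 8, 1e-8

J = np.diag(np.ones(n - 1), 1)          # single Jordan block J_n(0)
E = eps * np.outer(rng.standard_normal(n), rng.standard_normal(n))  # rank one

lam = np.linalg.eigvals(J + E)
print("|lambda| of perturbed block :", np.sort(np.abs(lam)))
print("predicted radius eps**(1/n) :", eps ** (1.0 / n))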


Estimating a Few Extreme Singular Values and Vectors for Large-Scale Matrices in Tensor Train Format

We propose new algorithms for singular value decomposition (SVD) of very large-scale matrices based on a low-rank tensor approximation technique called the tensor train (TT) format. The proposed algorithms can compute several dominant singular values and corresponding singular vectors for large-scale structured matrices given in a TT format. The computational complexity of the proposed methods ...
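The sketch below is not the TT-based method of the paper; it only illustrates the matrix-free principle such methods rely on, using SciPy's svds with a LinearOperator: a few dominant singular triplets of a large structured matrix are computed from matrix-vector products alone, which is exactly the kind of cheap operation a tensor train representation supplies. The diagonal-plus-rank-one example matrix is an arbitrary stand-in.

import numpy as np
from scipy.sparse.linalg import LinearOperator, svds

n = 4096
d = np.linspace(1.0, 2.0, n)            # diagonal part of A = D + u v^T
u = np.ones(n) / np.sqrt(n)             # rank-one part
v = np.linspace(-1.0, 1.0, n)

def matvec(x):                          # y = (D + u v^T) x, A never formed
    x = np.ravel(x)
    return d * x + u * (v @ x)

def rmatvec(y):                         # A^T y = (D + v u^T) y  (A is real)
    y = np.ravel(y)
    return d * y + v * (u @ y)

A = LinearOperator((n, n), matvec=matvec, rmatvec=rmatvec, dtype=np.float64)

# Six dominant singular values/vectors via an iterative, matrix-free solver.
U, s, Vt = svds(A, k=6)
print("dominant singular values:", np.sort(s)[::-1])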



Journal title:

Volume    Issue

Pages  -

Publication date: 2011